Add OpenTelemetry distributed tracing integration examples with OpenAI client #329
Conversation
@Xunzhuo @JaredforReal @yuluo-yx can you review it?
Pull Request Overview
This PR adds comprehensive OpenTelemetry (OTEL) distributed tracing integration examples to demonstrate end-to-end observability from client applications through the semantic router to vLLM backends. The implementation provides practical, working examples showing how to implement distributed tracing across the entire LLM inference pipeline.
- Complete Python example with auto-instrumentation of OpenAI client and automatic trace context propagation
- Full Docker Compose stack with Jaeger for trace collection and visualization
- Enhanced documentation with detailed setup, troubleshooting, and production deployment guidance
Reviewed Changes
Copilot reviewed 6 out of 6 changed files in this pull request and generated no comments.
| File | Description |
|---|---|
| website/docs/tutorials/observability/distributed-tracing.md | Enhanced documentation with end-to-end tracing examples and trace flow diagrams |
| examples/distributed-tracing/router-config.yaml | Example router configuration with OTLP exporter and sampling settings |
| examples/distributed-tracing/requirements.txt | Python dependencies for OpenTelemetry instrumentation |
| examples/distributed-tracing/openai_client_tracing.py | Complete Python example demonstrating auto-instrumentation and trace propagation |
| examples/distributed-tracing/docker-compose.yml | Docker Compose stack with Jaeger and semantic router |
| examples/distributed-tracing/README.md | Comprehensive 339-line guide with setup, troubleshooting, and production recommendations |
Overview
This PR adds comprehensive OpenTelemetry (OTEL) distributed tracing integration examples to demonstrate end-to-end observability from client applications through the semantic router to vLLM backends. This addresses the requirement for practical examples showing how to implement distributed tracing across the entire LLM inference pipeline.
What's Added
Complete Python Example (examples/distributed-tracing/)

A fully functional example demonstrating:
- Auto-instrumentation of the OpenAI Python client (a minimal sketch follows below)
- Three practical scenarios
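To show the shape of the pattern, here is a minimal sketch rather than the actual contents of openai_client_tracing.py; the exporter endpoint, service name, router address, and model name are assumptions. Because the OpenAI v1 SDK makes its HTTP calls through httpx, instrumenting httpx is enough to inject trace context into every request:

```python
# Minimal sketch, not the shipped example: exporter endpoint, service name,
# router URL, and model name below are assumptions.
from openai import OpenAI
from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.instrumentation.httpx import HTTPXClientInstrumentor
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

# Export spans over OTLP gRPC to a local collector (e.g. Jaeger all-in-one).
provider = TracerProvider(
    resource=Resource.create({"service.name": "openai-client-example"})
)
provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="localhost:4317", insecure=True))
)
trace.set_tracer_provider(provider)

# The OpenAI v1 SDK talks HTTP via httpx; instrumenting httpx injects the
# W3C `traceparent` header into every request, propagating context to the router.
HTTPXClientInstrumentor().instrument()

client = OpenAI(base_url="http://localhost:8801/v1", api_key="dummy")  # assumed router address
tracer = trace.get_tracer(__name__)

# Wrap the call in a parent span so the router's and backend's spans nest under it.
with tracer.start_as_current_span("chat-completion"):
    response = client.chat.completions.create(
        model="auto",  # hypothetical: the router may select the backend model
        messages=[{"role": "user", "content": "Hello, tracing!"}],
    )
    print(response.choices[0].message.content)
```

The span opened around the call becomes the parent of the router's and backend's spans, which is what makes the end-to-end trace visible in Jaeger. The packages this needs (openai, opentelemetry-sdk, opentelemetry-exporter-otlp, opentelemetry-instrumentation-httpx) correspond to what requirements.txt pins, give or take exact versions.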
Docker Compose Stack
Complete tracing stack with:
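Not the file itself, but a pared-down sketch of such a stack: the Jaeger image, ports, and OTLP toggle are the standard all-in-one settings, while the router image name and port are placeholders rather than the values in docker-compose.yml:

```yaml
# Sketch only; the router image/port are placeholders, not the PR's values.
services:
  jaeger:
    image: jaegertracing/all-in-one:latest
    environment:
      - COLLECTOR_OTLP_ENABLED=true   # accept OTLP traffic on 4317/4318
    ports:
      - "16686:16686"   # Jaeger UI
      - "4317:4317"     # OTLP gRPC ingest
  semantic-router:
    image: vllm-semantic-router:latest   # placeholder image name
    depends_on:
      - jaeger
    ports:
      - "8801:8801"     # placeholder router port
```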
Configuration Files
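Purely to illustrate what "OTLP exporter and sampling settings" might look like, here is a hypothetical shape; every key name below is invented for illustration, so defer to the actual examples/distributed-tracing/router-config.yaml for the real schema:

```yaml
# Hypothetical keys for illustration only; consult router-config.yaml
# in this PR for the router's actual configuration schema.
observability:
  tracing:
    enabled: true
    exporter:
      type: otlp
      endpoint: jaeger:4317   # matches the compose sketch above
      insecure: true
    sampling:
      rate: 1.0               # trace everything; lower this in production
```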
Documentation Updates
Enhanced website/docs/tutorials/observability/distributed-tracing.md with:

Trace Context Flow
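Each hop (client → semantic router → vLLM backend) is stitched into a single trace via the W3C Trace Context `traceparent` header, which the auto-instrumentation injects into outgoing HTTP requests. Its format, with example values taken from the W3C specification:

```text
traceparent: 00-0af7651916cd43dd8448eb211c80319c-b7ad6b7169203331-01
# format: version - trace-id (32 hex) - parent span-id (16 hex) - trace-flags (01 = sampled)
```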
Quick Start
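Assuming the files added by this PR, a run looks roughly like this; the Jaeger UI port is the all-in-one default, and the exact commands live in the README:

```bash
cd examples/distributed-tracing
docker compose up -d               # starts Jaeger and the semantic router
pip install -r requirements.txt    # OpenTelemetry SDK + instrumentation deps
python openai_client_tracing.py    # sends traced requests through the router
# open http://localhost:16686 to inspect the end-to-end trace in Jaeger
```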
Benefits
For Developers:
For Operations:
For Product Teams:
Validation
Files Changed
- examples/distributed-tracing/openai_client_tracing.py (NEW)
- examples/distributed-tracing/requirements.txt (NEW)
- examples/distributed-tracing/docker-compose.yml (NEW)
- examples/distributed-tracing/router-config.yaml (NEW)
- examples/distributed-tracing/README.md (NEW)
- website/docs/tutorials/observability/distributed-tracing.md (UPDATED)

Total: 6 files changed, 815 insertions(+), 6 deletions(-)
Closes #[issue_number]